Acta Psychologica Sinica ›› 2015, Vol. 47 ›› Issue (10): 1201-1212. DOI: 10.3724/SP.J.1041.2015.01201
WU Binxing; ZHANG Zhijun; SUN Yusheng
Abstract:
The interaction between facial gender and emotional expression has been a subject of ongoing debate. In some influential theories of face cognition, such as that of Haxby, Hoffman and Gobbini (2000), facial gender and emotional expression are described as being processed independently. But there is also empirical evidence suggesting that facial gender and emotional expression are processed interdependently. Researchers concerned with the interaction between facial gender and emotional expression seem to have ignored the role of facial familiarity. Based on existing evidence of facial familiarity’s effects on the processing of facial gender and of emotional expression, we hypothesized that facial familiarity would modulate the relationship between facial gender and emotional expression. The experiments were all based on Garner’s selective attention paradigm (Garner, 1976). The main logic of this paradigm is that if selective attention is possible in the presence of an irrelevant dimension, the two dimensions under investigation can be declared independent, or “separable”; if not, the dimensions are “integral.” In our experiments, participants were required to make speeded facial gender (or expression) classifications of four types of stimuli (angry-male, angry-female, happy-male, happy-female). They were instructed to ignore facial gender (or expression) when classifying expression (or facial gender). Stimuli were presented in two conditions, termed the control and orthogonal conditions, and participants completed both. In the control condition, stimuli varied along only the relevant dimension, and the irrelevant dimension was held constant. In the orthogonal condition, stimuli varied along both the relevant and the irrelevant dimension. In Experiment 1 (low facial familiarity), we used 16 unfamiliar face stimuli with 16 identities. Each stimulus was presented only once in both the control and orthogonal conditions.
With a between-subjects design, half of the 72 participants made the facial gender classification and the other half made the expression classification. Another 72 participants took part in Experiment 2 (medium facial familiarity). All settings of Experiment 2 were the same as in Experiment 1, except that the stimuli were another 16 faces with 8 identities, each identity having two expressions. Forty-eight participants took part in Experiment 3 (high facial familiarity); the face stimuli were the same as in Experiment 1, but each face was presented repeatedly in both the control and orthogonal conditions. In Experiment 4, 48 participants were trained with neutral faces of the same identities as the 16 faces used in Experiment 1, so that they became familiar with the stimuli’s identities; they then performed the same tasks as in Experiment 1. Overall, this study explored directly whether facial familiarity modulates the interaction between facial gender and emotional expression. We applied a 2 (condition: orthogonal, control) × 2 (facial gender: male, female) × 2 (expression: angry, happy) repeated-measures ANOVA to the reaction times (RTs) of the facial gender and expression tasks. In Experiment 1, there was a main effect of condition in the facial gender task (F(1,35) = 16.07, p < 0.001, ηp2 = 0.32): RTs in the orthogonal condition were significantly larger than RTs in the control condition (a Garner effect). In the expression task, however, there was no significant Garner effect (p > 0.05). This suggested that, under low facial familiarity, emotional expression had a unidirectional effect on the processing of facial gender. In Experiment 2, no significant Garner effect was found in the facial gender task (p > 0.05), but the interaction between facial gender and expression reached significance (F(1,35) = 19.35, p < 0.001, ηp2 = 0.36). Angry female faces took longer to classify as female than happy female faces did, whereas the pattern was reversed for male faces.
This interaction implied that emotional expression influenced the processing of facial gender. In the expression task, there was a marginally significant Garner effect (F(1,35) = 2.71, p = 0.109, ηp2 = 0.07): facial gender influenced the processing of emotional expression to a certain degree. In Experiment 3, there was no significant Garner effect in the facial gender task (p > 0.05), but a significant interaction between facial gender and expression was found again (F(1,23) = 31.46, p < 0.001, ηp2 = 0.58). Participants were slower to judge angry female faces as female than happy female faces, but there was no difference in RTs for judging angry and happy male faces as male, again suggesting that expression influenced the processing of facial gender. In the expression task, there was a significant Garner effect (F(1,23) = 15.95, p = 0.001, ηp2 = 0.41), meaning that facial gender had an impact on the processing of expression. The results of Experiments 2 and 3 suggested that, under high facial familiarity, facial gender and emotional expression influenced each other’s processing. Experiment 4 yielded almost the same results as Experiment 3, directly supporting the research hypothesis. In conclusion, for unfamiliar faces, emotional expression had a unidirectional effect on the processing of facial gender, but as facial familiarity increased, a bidirectional interaction arose between facial gender and emotional expression. Future work may focus on the mechanisms behind facial familiarity’s modulating effect on the interaction between facial gender and emotional expression.
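As an illustrative aside, the Garner-interference measure used throughout these experiments — the RT cost of the orthogonal block relative to the control block — can be sketched in a few lines of Python. This is not the authors’ analysis code, and all RT values below are invented example numbers; a positive difference corresponds to the Garner effect reported above.

```python
# Minimal sketch of the Garner-effect measure: mean RT in the orthogonal
# condition minus mean RT in the control condition.

def mean(xs):
    return sum(xs) / len(xs)

def garner_effect(control_rts, orthogonal_rts):
    """Return mean(orthogonal) - mean(control) in the same units as the input.
    A positive value indicates interference from the irrelevant dimension."""
    return mean(orthogonal_rts) - mean(control_rts)

# Hypothetical per-trial RTs (ms) for one participant in the gender task
control = [520, 535, 510, 540]
orthogonal = [575, 560, 590, 570]

print(f"Garner effect: {garner_effect(control, orthogonal):.1f} ms")  # → 47.5 ms
```

In the actual design, this difference would be computed per participant and entered into the repeated-measures ANOVA as the condition factor.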
Key words: facial familiarity, facial gender, emotional expression, Garner’s paradigm
WU Binxing, ZHANG Zhijun, SUN Yusheng. (2015). Facial Familiarity Modulates the Interaction between Facial Gender and Emotional Expression. Acta Psychologica Sinica, 47(10), 1201-1212.
URL: https://journal.psych.ac.cn/acps/EN/10.3724/SP.J.1041.2015.01201